Web Survey Bibliography
Title The Effect of Respondent Commitment on Response Quality in Two Online Surveys
Author Cibelli Hibben, K.
Year 2017
Access date 08.09.2017
Abstract Answering questions completely, accurately, and honestly is not always the top priority for survey respondents. To the extent that inaccuracy in survey responses is due to insufficient respondent effort, it might help to ask respondents directly to try harder and to elicit an explicit agreement from them to do so. The rationale for this technique is that agreeing, or stating one's intention, to behave in a certain way commits a person to carry out the terms of the agreement. Charles Cannell and his associates pioneered this technique in the late 1970s, and the results were promising. Existing studies found that respondents in the commitment condition (vs. control) gave significantly more mentions to open-ended items, reported more health conditions and larger amounts of food and drink consumed, scored higher on a precise-to-the-day index for reported health events, checked outside sources more often, and reported sensitive behaviors more readily (Oksenberg et al., 1977a; Oksenberg et al., 1977b). Similar results for commitment were observed in a telephone survey (Miller & Cannell, 1982). In an experimental web survey, Conrad et al. (under review) found that commitment improved response accuracy, particularly among respondents with a college education or more (results for the lower education groups were not significant), and that only a very small percentage of respondents (1%) refused to make the commitment. While promising, much of this research was conducted decades ago, in interviewer-administered modes, with limited measures of data quality.
The proposed paper presents results from two web-based studies examining the effect of commitment. The first study measures the effect of a simple "yes" or "no" commitment in an online labor force survey. The experiment was embedded in a survey conducted by the Institute for Labor Market and Occupational Research (Institut für Arbeitsmarkt- und Berufsforschung, IAB) in Germany, fielded in December 2014 – January 2015. The second study measures the effect of asking respondents to commit to engaging in several specific response behaviors that seem likely to promote data quality, such as reading the questions carefully and trying to be as precise as possible, in an online survey of parents of child patients at the University of Michigan (UM) Health System, fielded in March – May 2016. Both studies examine the effect of commitment on response accuracy as verified against administrative records – previous studies evaluating commitment have used only indirect measures of accuracy – in addition to its effect on reducing satisficing behaviors, item nonresponse, and socially desirable reporting.
Both studies produced mixed results for the overall effect of commitment. However, in Study 1 there were some particularly promising results for those who committed versus those who were invited to commit but did not, and in Study 2 for those who committed to all of the requested response behaviors versus those who committed to engage in only a few. Overall, the results offer insights into the underlying level of motivation of web survey respondents, such as their willingness to look up information in records, and raise challenging practical questions about how such techniques might be used in production surveys.
Access/Direct link Conference Homepage (abstract) / (presentation)
Year of publication 2017
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography - European Survey Research Association conference 2017, ESRA, Lisbon (26)
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Paradata as an aide to questionnaire design: Improving quality and reducing burden; 2017; Timm, E.; Stewart, J.; Sidney, I.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations; 2017; Jablonski, W.; Krzewinska, A.; Grzeszkiewicz-Radulska, K.
- Millennials and emojis in Spain and Mexico.; 2017; Bosch Jover, O.; Revilla, M.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Measuring Subjective Health and Life Satisfaction with U.S. Hispanics; 2017; Lee, S.; Davis, R.
- Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes; 2017; Jablonski, W.; Grzeszkiewicz-Radulska, K.; Krzewinska, A.
- A Comparison of Emerging Pretesting Methods for Evaluating “Modern” Surveys; 2017; Geisen, E., Murphy, J.
- The Effect of Respondent Commitment on Response Quality in Two Online Surveys; 2017; Cibelli Hibben, K.
- Pushing to web in the ISSP; 2017; Jonsdottir, G. A.; Dofradottir, A. G.; Einarsson, H. B.
- The 2016 Canadian Census: An Innovative Wave Collection Methodology to Maximize Self-Response and Internet...; 2017; Mathieu, P.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- In search of best practices; 2017; Kappelhof, J. W. S.; Steijn, S.
- Redirected Inbound Call Sampling (RICS); A New Methodology ; 2017; Krotki, K.; Bobashev, G.; Levine, B.; Richards, S.
- An Empirical Process for Using Non-probability Survey for Inference; 2017; Tortora, R.; Iachan, R.
- The perils of non-probability sampling; 2017; Bethlehem, J.
- A Comparison of Two Nonprobability Samples with Probability Samples; 2017; Zack, E. S.; Kennedy, J. M.
- A test of sample matching using a pseudo-web sample; 2017; Chatrchi, G., Gambino, J.
- A Partially Successful Attempt to Integrate a Web-Recruited Cohort into an Address-Based Sample; 2017; Kott, P. S., Farrelly, M., Kamyab, K.
- Nonprobability sampling as model construction; 2017; Mercer, A. W.